To Filter Prune, or to Layer Prune, That Is the Question
Authors
Abstract
Recent advances in pruning of neural networks have made it possible to remove a large number of filters or weights without any perceptible drop in accuracy. The number of parameters and that of FLOPs are usually the reported metrics to measure the quality of pruned models. However, the gain in speed for these pruned models is often overlooked in the literature due to the complex nature of latency measurements. In this paper, we show the limitation of filter pruning methods in terms of latency reduction and propose LayerPrune framework. LayerPrune presents a set of layer pruning methods based on different criteria that achieve higher latency reduction than filter pruning methods at similar accuracy. The advantage of layer pruning over filter pruning, in terms of latency reduction, is a result of the fact that the former is not constrained by the original model's depth and thus allows for a larger range of latency reduction. For each filter pruning method examined, we use the same filter importance criterion to calculate a per-layer importance score in one-shot. We then prune the least important layers and fine-tune the shallower model, which obtains comparable or better accuracy than its filter-based counterpart. This one-shot process allows removing layers from single-path networks, like VGG, before fine-tuning, unlike iterative filter pruning, where a minimum number of filters per layer is required to allow for data flow, which constrains the search space. To the best of our knowledge, we are the first to examine the effect of pruning methods on the latency metric instead of FLOPs for multiple networks, datasets and hardware targets. LayerPrune also outperforms handcrafted architectures such as Shufflenet, MobileNet, MNASNet and ResNet18 by 7.3%, 4.6%, 2.8% and 0.5% respectively on a similar latency budget on the ImageNet dataset (Code available at https://github.com/selkerdawy/filter-vs-layer-pruning).
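The one-shot scoring the abstract describes can be sketched in a few lines: a filter importance criterion is aggregated into a single per-layer score, the layers are ranked once, and the least important ones are dropped before fine-tuning. The sketch below is illustrative only; it assumes an L1-norm filter criterion and mean aggregation, and all function names and the toy model are hypothetical, not the paper's actual implementation.

```python
def filter_l1_norm(filter_weights):
    """L1 norm of one filter's flattened weights (assumed importance criterion)."""
    return sum(abs(w) for w in filter_weights)

def layer_importance(layer):
    """Aggregate per-filter importances into one per-layer score (mean, as an assumption)."""
    scores = [filter_l1_norm(f) for f in layer]
    return sum(scores) / len(scores)

def prune_least_important_layers(layers, num_to_prune):
    """One-shot layer pruning: score every layer once, drop the lowest-scoring ones."""
    scores = [layer_importance(layer) for layer in layers]
    ranked = sorted(range(len(layers)), key=lambda i: scores[i], reverse=True)
    keep = sorted(ranked[: len(layers) - num_to_prune])  # preserve original layer order
    return [layers[i] for i in keep]

# Toy "model": each layer is a list of filters, each filter a list of weights.
model = [
    [[0.9, -1.1], [1.2, 0.8]],    # layer 0: large weights, high score
    [[0.01, 0.02], [0.0, 0.03]],  # layer 1: near-zero weights, low score
    [[0.5, -0.4], [0.6, 0.7]],    # layer 2: medium score
]
pruned = prune_least_important_layers(model, num_to_prune=1)
# layer 1 has the lowest mean L1 score, so the pruned model keeps layers 0 and 2
```

Note that unlike iterative filter pruning, nothing here constrains each layer to retain a minimum number of filters; whole layers are simply removed, which is what widens the attainable latency-reduction range.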
Similar resources
To Filter, or Not to Filter: That Is the Question
Ballast water treatment technologies must meet stringent performance standards at the point of discharge. Currently, no matter the technology used, a filtration step appears crucial to remove large organisms present in the intake water, to ensure a minimum but still sufficient residual biocide or oxidant concentration, and to decrease the sediment load. Organisms potentially able to survive the...
To prune, or not to prune: exploring the efficacy of pruning for model compression
Model pruning seeks to induce sparsity in a deep neural network’s various connection matrices, thereby reducing the number of nonzero-valued parameters in the model. Recent reports (Han et al., 2015a; Narang et al., 2017) prune deep networks at the cost of only a marginal loss in accuracy and achieve a sizable reduction in model size. This hints at the possibility that the baseline models in th...
To Prune or Not to Prune: Responses of Coast Live Oaks (Quercus agrifolia) to Canopy Retention during Transplanting
A total of 62 coast live oaks (Quercus agrifolia) were monitored since they were initially boxed for transplantation in 1993. At that time, only branches injured during the moving process and deadwood were removed, leaving the entire canopy intact. This was a departure from the usual transplanting methodology that traditionally removes up to 70 percent of the canopy in order to compensate for t...
Ants prune to prime transport networks.
Since the invention of the internal combustion engine, traffic jams have become an inevitable part of life. 'One of the problems faced by any transportation system is dealing with changes in traffic congestion', says Tanya Latty, from the University of Sydney, Australia. 'Sometimes there are simply more cars on the road than the system can handle', she explains. And ants frequently face the same...
Journal
Journal title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-69535-4_45